Tensor Networks for Latent Variable Analysis. Part I: Algorithms for Tensor Train Decomposition

Authors

  • Anh Huy Phan
  • Andrzej Cichocki
  • André Uschmajew
  • Petr Tichavský
  • Gheorghe Luta
  • Danilo P. Mandic
Abstract

Decompositions of tensors into factor matrices, which interact through a core tensor, have found numerous applications in signal processing and machine learning. A more general tensor model which represents data as an ordered network of sub-tensors of order-2 or order-3 has, so far, not been widely considered in these fields, although this so-called tensor network decomposition has been long studied in quantum physics and scientific computing. In this study, we present novel algorithms and applications of tensor network decompositions, with a particular focus on the tensor train decomposition and its variants. The novel algorithms developed for the tensor train decomposition update, in an alternating way, one or several core tensors at each iteration, and exhibit enhanced mathematical tractability and scalability to exceedingly large-scale data tensors. The proposed algorithms are tested in classic paradigms of blind source separation from a single mixture, denoising, and feature extraction, and achieve superior performance over the widely used truncated algorithms for tensor train decomposition.
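
For context, the "widely used truncated algorithms" mentioned above refer to the standard TT-SVD procedure, which compresses a tensor by a sequence of truncated SVDs and serves as the baseline against which the paper's alternating core-update algorithms are compared. The sketch below is a minimal NumPy illustration of that baseline only, not of the proposed algorithms; the function names and the single fixed maximum rank are assumptions made for brevity.

    import numpy as np

    def tt_svd(tensor, max_rank):
        """Truncated TT-SVD (baseline sketch): split a d-way array into
        order-3 TT cores G[k] of shape (r_{k-1}, n_k, r_k)."""
        dims = tensor.shape
        cores, r_prev = [], 1
        unfolding = tensor.reshape(r_prev * dims[0], -1)
        for k in range(len(dims) - 1):
            U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
            r_k = min(max_rank, len(s))            # truncate to the target TT rank
            cores.append(U[:, :r_k].reshape(r_prev, dims[k], r_k))
            # carry the remainder on to the next mode
            unfolding = (np.diag(s[:r_k]) @ Vt[:r_k]).reshape(r_k * dims[k + 1], -1)
            r_prev = r_k
        cores.append(unfolding.reshape(r_prev, dims[-1], 1))
        return cores

    def tt_reconstruct(cores):
        """Contract the TT cores back into the full tensor."""
        full = cores[0]
        for core in cores[1:]:
            full = np.tensordot(full, core, axes=([-1], [0]))
        return full.reshape(full.shape[1:-1])      # drop the singleton boundary ranks

    # usage: a random 4-way tensor compressed to TT ranks of at most 5
    X = np.random.rand(8, 9, 10, 11)
    G = tt_svd(X, max_rank=5)
    print([g.shape for g in G])
    print(np.linalg.norm(X - tt_reconstruct(G)) / np.linalg.norm(X))

The algorithms proposed in the paper instead sweep over the cores and update one or several of them per iteration while keeping the others fixed, rather than performing a single truncation pass of this kind.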

Similar Articles

Tensor Ring Decomposition

Tensor networks have in recent years emerged as powerful tools for solving large-scale optimization problems. One of the most popular tensor networks is the tensor train (TT) decomposition, which acts as a building block for more complicated tensor networks. However, the TT decomposition depends strongly on the permutation of the tensor dimensions, owing to its strictly sequential multilinear products ...
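
The tensor ring construction itself is cut off in the excerpt above. As a hedged illustration of the standard construction, the sketch below (in the same NumPy style as the TT example earlier) reconstructs a full array from ring cores by chaining the same sequential multilinear products used in a tensor train and then closing the chain with a trace over the matched boundary ranks, which removes the special role of the first and last dimensions. The function name and example sizes are assumptions.

    import numpy as np

    def tr_reconstruct(cores):
        """Contract order-3 tensor-ring cores G[k] of shape (r_{k-1}, n_k, r_k),
        with r_0 == r_d, back into the full tensor."""
        full = cores[0]                                    # shape (r_0, n_1, r_1)
        for core in cores[1:]:
            # the same sequential multilinear products as in a tensor train
            full = np.tensordot(full, core, axes=([-1], [0]))
        # full now has shape (r_0, n_1, ..., n_d, r_d); the trace closes the ring
        return np.trace(full, axis1=0, axis2=full.ndim - 1)

    # usage: ring cores for a (4, 5, 6) tensor with all ring ranks equal to 3
    cores = [np.random.rand(3, n, 3) for n in (4, 5, 6)]
    print(tr_reconstruct(cores).shape)   # (4, 5, 6)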

Full text

Hierarchical Tensor Decomposition of Latent Tree Graphical Models

We approach the problem of estimating the parameters of a latent tree graphical model from a hierarchical tensor decomposition point of view. In this new view, the marginal probability table of the observed variables is treated as a tensor, and we show that: (i) the latent variables induce low rank structures in various matricizations of the tensor; (ii) this collection of low rank matricizatio...
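
Point (i) can be checked numerically on the simplest latent tree, a single hidden node connected to three observed variables: the joint probability table is then a sum of k rank-one terms, one per hidden state, so every single-mode matricization has rank at most k. The snippet below is an illustrative check only; the variable names, cardinalities, and the choice k = 3 are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    k, dims = 3, (4, 5, 6)                 # hidden states and observed cardinalities
    prior = rng.dirichlet(np.ones(k))      # P(H)
    cond = [rng.dirichlet(np.ones(n), size=k).T for n in dims]   # P(X_m | H), shape (n_m, k)

    # joint table P(x1, x2, x3) = sum_h P(h) * P(x1 | h) * P(x2 | h) * P(x3 | h)
    P = np.einsum('h,ah,bh,ch->abc', prior, *cond)

    # each single-mode matricization has rank at most k (generically exactly k)
    for mode in range(3):
        mat = np.moveaxis(P, mode, 0).reshape(dims[mode], -1)
        print(mode, np.linalg.matrix_rank(mat))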

Full text

Sample Complexity Analysis for Learning Overcomplete Latent Variable Models through Tensor Methods

We provide guarantees for learning latent variable models, with an emphasis on the overcomplete regime, where the dimensionality of the latent space can exceed the observed dimensionality. In particular, we consider multiview mixtures, spherical Gaussian mixtures, ICA, and sparse coding models. We provide tight concentration bounds for empirical moments through novel covering arguments. We analyze pa...

Full text

A simple form of MT impedance tensor analysis to simplify its decomposition to remove the effects of near surface small-scale 3-D conductivity structures

The magnetotelluric (MT) method is a natural-source electromagnetic (EM) technique used for geothermal, petroleum, geotechnical, groundwater and mineral exploration. MT is also routinely used for mapping deep subsurface structures. In this method, the measured regional complex impedance tensor (Z) is substantially distorted by any topographical feature or small-scale, near-surface, three-dimensional (...

Full text

Provable Algorithms for Machine Learning Problems

Modern machine learning algorithms can extract useful information from text, images and videos. All of these applications involve solving NP-hard problems in the average case using heuristics. What properties of the input allow such problems to be solved efficiently? Theoretically analyzing these heuristics is often very challenging, and few results were known. This thesis takes a different approach: we identify natur...

Full text


Full text
Journal:
  • CoRR

Volume: abs/1609.09230   Issue: -

Pages: -

Publication date: 2016